Welcome to P K Kelkar Library, Online Public Access Catalogue (OPAC)


The theory of linear prediction

By: Vaidyanathan, P. P.
Material type: materialTypeLabelBookSeries: Synthesis lectures on signal processing: #3.Publisher: San Rafael, Calif. (1537 Fourth Street, San Rafael, CA 94901 USA) : Morgan & Claypool Publishers, c2008Description: 1 electronic text (xiv, 183 p. : ill.) : digital file.ISBN: 1598295764 (electronic bk.); 9781598295764 (electronic bk.); 1598295756 (pbk.); 9781598295757 (pbk.).Uniform titles: Synthesis digital library of engineering and computer science. Subject(s): Prediction theory | Linear prediction theory | Vector linear prediction | Linear estimation | Filtering | Smoothing | Line spectral processes | Levinson's recursion | Lattice structures | AutoregressivemodelsDDC classification: 519.287 Online resources: Abstract with links to resource | Abstract with links to full text Available also in print.
Contents:
1. Introduction -- 1.1. History of linear prediction -- 1.2. Scope and outline
2. The optimal linear prediction problem -- 2.1. Introduction -- 2.2. Prediction error and prediction polynomial -- 2.3. The normal equations -- 2.4. Properties of the autocorrelation matrix -- 2.5. Estimating the autocorrelation -- 2.6. Concluding remarks
3. Levinson's recursion -- 3.1. Introduction -- 3.2. Derivation of Levinson's recursion -- 3.3. Simple properties of Levinson's recursion -- 3.4. The whitening effect -- 3.5. Concluding remarks
4. Lattice structures for linear prediction -- 4.1. Introduction -- 4.2. The backward predictor -- 4.3. Lattice structures -- 4.4. Concluding remarks
5. Autoregressive modeling -- 5.1. Introduction -- 5.2. Autoregressive processes -- 5.3. Approximation by an AR(N) process -- 5.4. Autocorrelation matching property -- 5.5. Power spectrum of the AR model -- 5.6. Application in signal compression -- 5.7. MA and ARMA processes -- 5.8. Summary
6. Prediction error bound and spectral flatness -- 6.1. Introduction -- 6.2. Prediction error for an AR process -- 6.3. A measure of spectral flatness -- 6.4. Spectral flatness of an AR process -- 6.5. Case where signal is not AR -- 6.6. Maximum entropy and linear prediction -- 6.7. Concluding remarks
7. Line spectral processes -- 7.1. Introduction -- 7.2. Autocorrelation of a line spectral process -- 7.3. Time domain descriptions -- 7.4. Further properties of time domain descriptions -- 7.5. Prediction polynomial of line spectral processes -- 7.6. Summary of properties -- 7.7. Identifying a line spectral process in noise -- 7.8. Line spectrum pairs -- 7.9. Concluding remarks
8. Linear prediction theory for vector processes -- 8.1. Introduction -- 8.2. Formulation of the vector LPC problem -- 8.3. Normal equations: vector case -- 8.4. Backward prediction -- 8.5. Levinson's recursion: vector case -- 8.6. Properties derived from Levinson's recursion -- 8.7. Transfer matrix functions in vector LPC -- 8.8. The FIR lattice structure for vector LPC -- 8.9. The IIR lattice structure for vector LPC -- 8.10. The normalized IIR lattice -- 8.11. The paraunitary or MIMO all-pass property -- 8.12. Whitening effect and stalling -- 8.13. Properties of transfer matrices in LPC theory -- 8.14. Concluding remarks
A. Linear estimation of random variables -- A.1. The orthogonality principle -- A.2. Closed-form solution -- A.3. Consequences of orthogonality -- A.4. Singularity of the autocorrelation matrix
B. Proof of a property of autocorrelations
C. Stability of the inverse filter
D. Recursion satisfied by AR autocorrelations
Problems -- References -- Author biography -- Index.
Summary: Linear prediction theory has had a profound impact on the field of digital signal processing. Although the theory dates back to the early 1940s, its influence can still be seen in applications today. The theory is based on very elegant mathematics and leads to many beautiful insights into statistical signal processing. Although prediction is only one part of the more general topics of linear estimation, filtering, and smoothing, this book focuses on linear prediction. This has enabled detailed discussion of a number of issues that are normally not found in texts. For example, the theory of vector linear prediction is explained in considerable detail, and so is the theory of line spectral processes. This focus, together with the book's small size, sets it apart from many excellent texts that cover the topic, including a few devoted entirely to linear prediction. There are several examples and computer-based demonstrations of the theory. Applications are mentioned wherever appropriate, but the focus is not on their detailed development. The writing style is meant to be suitable for self-study as well as for classroom use at the senior and first-year graduate levels. The text is self-contained for readers with introductory exposure to signal processing, random processes, and the theory of matrices; a historical perspective and a detailed outline are given in the first chapter.
Item type: E books
Current location: PK Kelkar Library, IIT Kanpur
Status: Available
Barcode: EBKE043
Total holds: 0

Mode of access: World Wide Web.

System requirements: Adobe Acrobat reader.

Part of: Synthesis digital library of engineering and computer science.

Series from website.

Includes bibliographical references (p. 171-175) and index.


Abstract freely available; full-text restricted to subscribers or individual document purchasers.

Indexed by: Compendex; INSPEC; Google Scholar; Google Book Search.


Title from PDF t.p. (viewed Oct. 19, 2008).

